A truncated Newton optimization algorithm in meteorology applications with analytic Hessian/vector products

Authors

  • Zhi Wang
  • Ionel Michael Navon
  • X. Zou
  • F. X. Le Dimet
Abstract

A modified version of the truncated-Newton algorithm of Nash ([24], [25], [29]) is presented, differing from it only in the use of an exact Hessian-vector product for carrying out the large-scale unconstrained optimization required in variational data assimilation. The exact Hessian-vector product is obtained by solving an optimal control problem of distributed parameters (i.e. the system under study occupies a certain spatial and temporal domain and is modeled by partial differential equations). The algorithm is referred to as the adjoint truncated-Newton algorithm. It is based on first- and second-order adjoint techniques, which yield a better approximation to the Newton line-search direction for the problem tested here. The adjoint truncated-Newton algorithm is applied to a limited-area shallow-water equations model with model-generated data, where the initial conditions serve as control variables. We compare its performance with that of the original truncated-Newton method [29] and the L-BFGS (limited-memory BFGS) method of Liu and Nocedal [23]. Our numerical tests yield results that are twice as fast as those obtained by the truncated-Newton algorithm and faster than the L-BFGS method, both in terms of number of iterations and in terms of CPU time.
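The outer/inner structure of a truncated-Newton method can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the callable `hvp(x, v)` stands in for the exact Hessian-vector product that the paper obtains from a second-order adjoint model, and here it is supplied analytically for a toy quadratic; the truncation (forcing) rule and Armijo line search are generic choices.

```python
import numpy as np

def truncated_newton(f, grad, hvp, x0, max_outer=50, tol=1e-8):
    """Truncated-Newton sketch: the inner CG loop solves H d = -g only
    approximately, using Hessian-vector products hvp(x, v) and never
    forming the Hessian itself."""
    x = x0.astype(float)
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Inner conjugate-gradient loop, truncated early.
        d = np.zeros_like(x)
        r = -g.copy()            # residual of H d + g = 0
        p = r.copy()
        rr = r @ r
        for _ in range(2 * len(x)):
            Hp = hvp(x, p)
            pHp = p @ Hp
            if pHp <= 1e-12 * (p @ p):      # negative-curvature safeguard
                break
            alpha = rr / pHp
            d += alpha * p
            r -= alpha * Hp
            rr_new = r @ r
            # Forcing criterion: tighter inner solves as the gradient shrinks.
            if np.sqrt(rr_new) < min(0.5, np.sqrt(np.linalg.norm(g))) * np.linalg.norm(g):
                break
            p = r + (rr_new / rr) * p
            rr = rr_new
        if not d.any():
            d = -g                            # fall back to steepest descent
        # Backtracking Armijo line search along the (approximate) Newton direction.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x
```

For a quadratic `f(x) = 0.5 xᵀAx − bᵀx` the exact product is simply `A @ v`, so the iteration converges to the solution of `Ax = b`.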


Similar articles

Numerical Experience with Limited-Memory Quasi-Newton and Truncated Newton Methods

Computational experience with several limited-memory quasi-Newton and truncated Newton methods for unconstrained nonlinear optimization is described. Comparative tests were conducted on a well-known test library [J. on several synthetic problems allowing control of the clustering of eigenvalues in the Hessian spectrum, and on some large-scale problems in oceanography and meteorology. The result...


A class of multi-agent discrete hybrid non linearizable systems: Optimal controller design based on quasi-Newton algorithm for a class of sign-undefinite hessian cost functions

 In the present paper, a class of hybrid, nonlinear and non linearizable dynamic systems is considered. The noted dynamic system is generalized to a multi-agent configuration. The interaction of agents is presented based on graph theory and finally, an interaction tensor defines the multi-agent system in leader-follower consensus in order to design a desirable controller for the noted system. A...


Preconditioning for Hessian-Free Optimization

Recently Martens adapted the Hessian-free optimization method for the training of deep neural networks. One key aspect of this approach is that the Hessian is never computed explicitly; instead, the Conjugate Gradient (CG) algorithm is used to compute the new search direction by applying only matrix-vector products of the Hessian with arbitrary vectors. This can be done efficiently using a varian...
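The matrix-free Hessian-vector product mentioned above can be illustrated with a forward difference of gradients, a common stand-in when no exact second-order (adjoint or R-operator) code is available; the quadratic test function in the usage below is purely illustrative:

```python
import numpy as np

def hvp_fd(grad, x, v, eps=1e-6):
    # Matrix-free Hessian-vector product:
    #   H v ≈ (grad(x + eps*v) - grad(x)) / eps
    # Only two gradient evaluations are needed; H is never formed.
    return (grad(x + eps * v) - grad(x)) / eps
```

For a quadratic, where the gradient is linear, this forward difference reproduces `A @ v` up to floating-point error; for general nonlinear functions it is only first-order accurate in `eps`, which is exactly why exact (adjoint) products are preferable when available.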


Speeding-Up Convergence via Sequential Subspace Optimization: Current State and Future Directions

This is an overview paper written in the style of a research proposal. In recent years we introduced a general framework for large-scale unconstrained optimization, Sequential Subspace Optimization (SESOP), and demonstrated its usefulness for sparsity-based signal/image denoising, deconvolution, compressive sensing, computed tomography, diffraction imaging, and support vector machines. We explored its co...


Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization

This paper deals with the preconditioning of truncated Newton methods for the solution of large scale nonlinear unconstrained optimization problems. We focus on preconditioners which can be naturally embedded in the framework of truncated Newton methods, i.e. which can be built without storing the Hessian matrix of the function to be minimized, but only based upon information on the Hessian obt...
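One simple preconditioner of this matrix-free kind is a Hutchinson-style diagonal estimate assembled purely from Hessian-vector probes; this is an illustrative sketch of the general idea, not the specific preconditioner studied in the paper (the function `hvp` and the probe count are assumptions):

```python
import numpy as np

def diag_precond_estimate(hvp, n, n_probes=100, seed=None):
    # Hutchinson-style diagonal estimate: for Rademacher probes v,
    # E[v * (H v)] = diag(H). The result can serve as a Jacobi-type
    # preconditioner inside truncated-Newton CG without storing H.
    rng = np.random.default_rng(seed)
    d = np.zeros(n)
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        d += v * hvp(v)
    return d / n_probes
```

For a diagonal Hessian the estimate is exact for every probe (since `v * (H v) = diag(H) * v**2` and `v**2 = 1`); for general matrices it converges to the diagonal as the number of probes grows.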




Journal:
  • Comp. Opt. and Appl.

Volume 4, Issue -

Pages -

Publication year: 1995